Journal article
Multi-step gradient methods for networked optimization
E Ghadimi, I Shames, M Johansson
IEEE Transactions on Signal Processing | Published: 2013
Abstract
We develop multi-step gradient methods for network-constrained optimization of strongly convex functions with Lipschitz-continuous gradients. Given the topology of the underlying network and bounds on the Hessian of the objective function, we determine the algorithm parameters that guarantee the fastest convergence and characterize situations when significant speed-ups over the standard gradient method are obtained. Furthermore, we quantify how uncertainty in problem data at design-time affects the run-time performance of the gradient method and its multi-step counterpart, and conclude that in most cases the multi-step method outperforms gradient descent. Finally, we apply the proposed techn..
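The multi-step gradient method referred to in the abstract augments the standard gradient step with a momentum term built from the previous iterate. As a hedged illustration (not the paper's network-constrained algorithm), the sketch below runs the classical heavy-ball iteration on a strongly convex quadratic, choosing the step size and momentum from the Hessian bounds mu and L; the function name `heavy_ball` and the test problem are illustrative assumptions.

```python
import numpy as np

def heavy_ball(A, b, x0, iters=200):
    """Multi-step (heavy-ball) gradient sketch for f(x) = 0.5 x'Ax - b'x.

    Illustrative only: a centralized quadratic example, not the paper's
    network-constrained algorithm. Parameters follow the classical
    heavy-ball choice based on the Hessian bounds mu and L.
    """
    eigs = np.linalg.eigvalsh(A)
    mu, L = eigs[0], eigs[-1]          # strong convexity / Lipschitz bounds
    alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
    beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        grad = A @ x - b
        # Multi-step update: gradient step plus momentum from the last move.
        x_next = x - alpha * grad + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Usage on a small ill-conditioned quadratic (condition number 100):
A = np.diag([1.0, 100.0])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)   # exact minimizer
x_hb = heavy_ball(A, b, np.zeros(2))
```

For ill-conditioned problems, the momentum term improves the geometric convergence factor from roughly (L - mu)/(L + mu) for plain gradient descent to (sqrt(L) - sqrt(mu))/(sqrt(L) + sqrt(mu)), which is the kind of speed-up the abstract quantifies.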
Funding Acknowledgements
This work was sponsored in part by the Swedish Research Council (VR) and The Swedish Foundation for Strategic Research (SSF).